Installation

Requirements

NLP Architect requires Python 3.6+ running on a Linux* or UNIX-based OS (such as Mac OS). We recommend using the library with Ubuntu 16.04+.
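To verify that a suitable interpreter is available on your system, you can run:

python3 --version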

Before installing the library, make sure you have the most recent versions of the packages listed below:

Ubuntu* 16.04+ or CentOS* 7.4+   Mac OS X*    Description
python-pip                       pip          Tool to install Python dependencies
libhdf5-dev                      h5py         Enables loading of hdf5 formats
pkg-config                       pkg-config   Retrieves information about installed libraries
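On Ubuntu, for example, these prerequisites can typically be installed with apt (exact package names may vary by release):

sudo apt-get install python-pip libhdf5-dev pkg-config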

Note

The default installation of NLP Architect uses CPU-based binaries of all deep learning frameworks. Intel Optimized MKL-DNN binaries will be installed if Linux is detected. The GPU backend is supported only on Linux and only if a GPU is present. See the details below for instructions on how to install each backend.

Installation

Prerequisites

Make sure pip, setuptools and venv are up to date before installing.

pip3 install -U pip setuptools venv

We recommend installing NLP Architect in a virtual environment to self-contain the work done using the library.

To create and activate a new virtual environment (or skip this step and follow the Quick Install section below):

python3 -m venv .nlp_architect_env
source .nlp_architect_env/bin/activate

Quick Install

Select the desired configuration for your system: where to install from, whether to create a virtualenv, which backend to use, and whether to install in developer mode. Then run the commands from the sections below that match your choices.

It is recommended to install NLP Architect in development mode to utilize all its features, examples and solutions.
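For reference, a typical end-to-end sequence on a Linux machine with the default CPU backend (a sketch assembled from the steps described below) looks like:

git clone https://github.com/NervanaSystems/nlp-architect.git
cd nlp-architect
python3 -m venv .nlp_architect_env
source .nlp_architect_env/bin/activate
export NLP_ARCHITECT_BE=CPU
pip3 install -e .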

Install from source

To get started, clone our repository:

git clone https://github.com/NervanaSystems/nlp-architect.git
cd nlp-architect

Selecting a backend

NLP Architect supports CPU, GPU and Intel Optimized Tensorflow (MKL-DNN) backends. Users can select the desired backend using a dedicated environment variable (default: CPU). (MKL-DNN and GPU backends are supported only on Linux)

export NLP_ARCHITECT_BE=CPU/MKL/GPU

Installation

NLP Architect is installed using pip; we recommend installing it in development mode.

Default:

pip3 install .

Development mode:

pip3 install -e .

Once installed, the nlp_architect command provides additional options for working with the library; issue nlp_architect -h to see all options.
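For example, to list the available sub-commands and options:

nlp_architect -h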

Updating NLP Architect

Depending on how you installed NLP Architect, update the library as follows:

From source

git pull origin master
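If you installed in the default (non-editable) mode, re-run the install after pulling so the updated source is picked up; a development-mode (editable) install picks up changes automatically:

pip3 install .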

Using pip

pip install -U nlp-architect

Compiling Intel® optimized Tensorflow with MKL-DNN

NLP Architect supports the MKL-DNN flavor of Tensorflow out of the box; however, if you wish to compile Tensorflow yourself, instructions are provided below.

Tensorflow has a guide for compiling and installing Tensorflow with MKL-DNN optimization. Make sure to install all required tools: bazel and the Python development dependencies.
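On Ubuntu, for example, the Python development dependencies can typically be installed as follows (bazel itself is installed per the instructions on bazel.build; the package names here are common Ubuntu ones and may differ on your system):

sudo apt-get install python3-dev
pip3 install -U numpy wheel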

Alternatively, follow the instructions below to compile and install the latest version of Tensorflow with MKL-DNN:

  • Clone Tensorflow repository from GitHub:

    git clone https://github.com/tensorflow/tensorflow
    cd tensorflow
    
  • Configure Tensorflow for compilation:

    ./configure
    
  • Compile Tensorflow with MKL-DNN:

    bazel build --config=mkl --config=opt //tensorflow/tools/pip_package:build_pip_package
    
  • Create pip package in /tmp/tensorflow_pkg:

    bazel-bin/tensorflow/tools/pip_package/build_pip_package /tmp/tensorflow_pkg
    
  • Install Tensorflow pip package:

    pip install <tensorflow package name>.whl
    
  • Refer to Tensorflow's performance guide for specific configuration to get optimal performance when running your model.